
Avoid Bun memory leak bug from TransformStream #4255

Draft

TheodoreSpeaks wants to merge 3 commits into staging from fix/memory-leak

Conversation

Collaborator

@TheodoreSpeaks TheodoreSpeaks commented Apr 22, 2026

Summary

Bun has a bug in TransformStream (oven-sh/bun#28035) that causes high memory consumption because no backpressure signal is propagated. We use this in
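The workaround pattern the PR adopts can be sketched as follows. This is an illustrative sketch, not the repo's code: the name `pumpWithoutTransform` and its signature are assumptions. The idea is to pull from a single reader and await the consumer on each chunk, so the producer only advances as fast as the consumer, instead of piping through a `TransformStream` whose Bun implementation buffers without signaling backpressure.

```typescript
// Illustrative sketch (names are assumptions, not the repo's API):
// pump a source stream manually with a single reader so chunks are
// pulled on demand rather than eagerly buffered by a TransformStream.
async function pumpWithoutTransform(
  source: ReadableStream<Uint8Array>,
  onChunk: (chunk: Uint8Array) => void | Promise<void>
): Promise<boolean> {
  const reader = source.getReader();
  try {
    while (true) {
      const { done, value } = await reader.read();
      if (done) return true; // source fully drained
      await onChunk(value); // awaiting the consumer is the backpressure
    }
  } finally {
    reader.releaseLock();
  }
}
```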

Type of Change

  • Bug fix
  • New feature
  • Breaking change
  • Documentation
  • Other: ___________

Testing

  • Validated locally that streaming responses still work with an agent block in a workflow.

Checklist

  • Code follows project style guidelines
  • Self-reviewed my changes
  • Tests added/updated and passing
  • No new warnings introduced
  • I confirm that I have read and agree to the terms outlined in the Contributor License Agreement (CLA)

Screenshots/Videos

@vercel

vercel Bot commented Apr 22, 2026

The latest updates on your projects. Learn more about Vercel for GitHub.

1 Skipped Deployment

| Project | Deployment | Actions | Updated (UTC) |
|---|---|---|---|
| docs | Skipped | Skipped | Apr 23, 2026 10:17pm |


@TheodoreSpeaks
Collaborator Author

@BugBot review

@cursor

cursor Bot commented Apr 22, 2026

PR Summary

Medium Risk
Touches core streaming execution plumbing and how agent responses are persisted, so regressions could break SSE streaming or produce missing/incorrect block outputs; scope is limited to streaming paths with added safeguards.

Overview
Avoids Bun TransformStream memory issues by changing how streaming block outputs are consumed and persisted.

BlockExecutor.handleStreamingExecution no longer tee()s the source stream; it now reads via a single getReader(), forwards bytes to the client stream, assembles full text after drain, and skips output/memory persistence if the source wasn’t fully drained.
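The single-reader flow described above can be sketched like this. All names here (`forwardAndAccumulate`, `wasDrained`, etc.) are illustrative assumptions, not the actual `block-executor.ts` implementation: one reader feeds the client stream from its `pull` handler while accumulating the full text, and only a `done: true` read marks the accumulator as complete.

```typescript
// Illustrative sketch of the single-reader pattern (not the repo's code):
// no tee() — one reader forwards bytes to the client stream and
// accumulates text, tracking whether the source actually drained.
function forwardAndAccumulate(source: ReadableStream<Uint8Array>) {
  const reader = source.getReader();
  const decoder = new TextDecoder();
  let fullText = "";
  let sourceDrained = false;

  const clientStream = new ReadableStream<Uint8Array>({
    async pull(controller) {
      const { done, value } = await reader.read();
      if (done) {
        sourceDrained = true; // only now is fullText the complete response
        controller.close();
        return;
      }
      fullText += decoder.decode(value, { stream: true });
      controller.enqueue(value);
    },
    cancel(reason) {
      void reader.cancel(reason);
    },
  });

  return {
    clientStream,
    getFullText: () => fullText,
    wasDrained: () => sourceDrained,
  };
}
```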

Agent streaming memory persistence is moved from memoryService.wrapStreamForPersistence() (removed) to a new optional StreamingExecution.onFullContent callback, which is invoked after the stream drains to append the assistant message to memory.
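The `onFullContent` field name comes from the PR description; the surrounding types below are assumptions sketched to show the shape of the hook. The callback fires once, after the stream fully drains, replacing the removed `wrapStreamForPersistence()` wrapper.

```typescript
// Hypothetical shape of the hook described above; onFullContent is named
// in the PR, but the rest of this interface is an illustrative assumption.
interface StreamingExecution {
  stream: ReadableStream<Uint8Array>;
  // Invoked once after the stream drains, with the assembled text,
  // e.g. to append the assistant message to memory.
  onFullContent?: (fullContent: string) => void | Promise<void>;
}

// Caller side: persist only when the source was actually drained.
async function persistAfterDrain(
  exec: StreamingExecution,
  fullText: string,
  drained: boolean
): Promise<void> {
  if (drained && exec.onFullContent) {
    await exec.onFullContent(fullText);
  }
}
```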

Reviewed by Cursor Bugbot for commit bf2f9af. Bugbot is set up for automated code reviews on this repo. Configure here.

Comment thread: apps/sim/executor/execution/block-executor.ts
Previously, if the onStream consumer caught an internal error without
re-throwing, the block-executor would treat the shortened accumulator
as the complete response, persist a truncated string to memory via
appendToMemory, and set it as executionOutput.content.

Track whether the source ReadableStream actually closed (done=true) in
the pull handler. If onStream returns before the source drains, skip
content persistence and log a warning — the old tee()-based flow was
immune to this because the executor branch drained independently of
the client branch.

Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
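The guard this commit message describes can be sketched as follows. The names `finalizeBlockOutput`, `persist`, and `warn` are illustrative assumptions, not the repo's API: persistence runs only when the source reported `done: true`, otherwise the truncated accumulator is discarded with a warning.

```typescript
// Illustrative sketch of the drained-flag guard (names are assumptions):
// never persist the accumulator unless the source stream fully closed.
function finalizeBlockOutput(
  accumulated: string,
  sourceDrained: boolean,
  persist: (content: string) => void,
  warn: (msg: string) => void
): string | undefined {
  if (!sourceDrained) {
    // A consumer that caught an error without re-throwing would otherwise
    // persist a truncated string; skip persistence and flag it instead.
    warn("stream consumer returned before source drained; skipping content persistence");
    return undefined;
  }
  persist(accumulated);
  return accumulated;
}
```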
@TheodoreSpeaks
Collaborator Author

@BugBot review


cursor Bot left a comment


✅ Bugbot reviewed your changes and found no new issues!

Comment @cursor review or bugbot run to trigger another review on this PR

Reviewed by Cursor Bugbot for commit bf2f9af. Configure here.

